Shoreline Transformational Adaptation: Forward Looking Nature-based Climate Resilience¶
Our proposed framework recognizes the fundamental need for innovation and entrepreneurship to create the next generation of nature-inspired coastal protection systems developed through the RISE Rural and Urban Coastal Community Resilience Challenge projects. To support this effort, the Mason Flood Hazards Research Lab, the Center for Ocean-Land-Atmosphere Studies, and the Business for a Better World Center will: 1) develop engineering and scientific validation of the RISE innovative solutions, providing the foundation for widespread implementation of coastal protection at scale; 2) support a forward-looking design strategy ensuring that the proposed alternatives are resilient to extreme events and future climate conditions; and 3) help translate pilot studies to broader markets and communities and mitigate barriers related to social acceptance and adoption.
Project Objectives¶
Evaluate the performance of innovative types of nature-based solutions based on field-scale prototypes and environmental measurements and enhance the current understanding of how boat wakes and wind waves are attenuated by these interventions.
Determine how these nature-based solutions help address shoreline erosion under a range of conditions including high-frequency events and extreme events.
Evaluate the performance of innovative nature-based solutions for shoreline protection under a range of future climate conditions.
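One way to quantify the attenuation targeted by the first objective is a wave transmission coefficient, Kt = Hs,leeward / Hs,seaward, computed from paired sensors on either side of an intervention. A minimal sketch with hypothetical significant-wave-height values (the arrays below are illustrative, not field data):

```python
import numpy as np

# Hypothetical paired Hs records [m] from sensors seaward and leeward of a sill.
hs_seaward = np.array([0.30, 0.42, 0.25, 0.55])
hs_leeward = np.array([0.18, 0.27, 0.16, 0.31])

# Transmission coefficient Kt = Hs_lee / Hs_sea; attenuation = 1 - Kt.
kt = hs_leeward / hs_seaward
reduction_pct = (1 - kt) * 100  # percent wave-height reduction per burst
print(reduction_pct)
```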
import warnings
warnings.filterwarnings('ignore')

import json
import pathlib as pl
from random import randint

import numpy as np
import pandas as pd
import xarray as xr
import netCDF4 as nc4
import requests
import matplotlib.pyplot as plt
import plotly.graph_objs as go
import folium
import geopandas as gpd
from chart_studio import plotly
from plotly.offline import plot, iplot, init_notebook_mode
from shapely import Polygon, Point, LineString
from rosely import WindRose
Initialize¶
root = pl.Path('/Users/tmiesse/work/FHRL/seagrant/field')
Task 1: Evaluation of the performance of innovative types of nature-based solutions based on field-scale prototypes and environmental measurements¶
Field sites¶
Canal¶
c_sensors = gpd.read_file(root / 'gps' / 'canal0721_processed' / 'canal_wgs' / 'points_wgs.shp')
c_sensors
m
Transect 1 - East Bank: No Wave Data¶
Transect 1 - East Bank: Currents (Change to real adcp in canal)¶
fig.update_layout(paper_bgcolor='rgba(0,0,0,0)',
                  plot_bgcolor='rgba(0,0,0,0)',
                  margin=dict(l=1, r=1, t=1, b=1),
                  height=400, width=450,
                  legend=dict(title='Current [m/s]'))
#fig
Transect 2 - East Bank: Waves¶
fig.show()
Transect 3 - West Bank: Waves¶
fig.show()
Transect 4 - West Bank: Waves (Missing ODU RBRs 201398,211572)¶
fig.show()
Transect 5 - West Bank: Waves¶
fig.show()
Transect 6 - North Bank: Waves¶
fig.show()
Canal - All sensors¶
fig.show()
Marsh¶
m
Transect 1 - Rock Sill: Waves¶
fig.show()
Transect 2 - Marsh Sill Gap: Waves¶
fig.show()
Transect 2 - Marsh Sill Gap: Currents¶
fig.update_layout(paper_bgcolor='rgba(0,0,0,0)',
                  plot_bgcolor='rgba(0,0,0,0)',
                  margin=dict(l=1, r=1, t=1, b=1),
                  height=400, width=450,
                  legend=dict(title='Current [m/s]'))
Transect 3 - Oyster Sill: Waves¶
fig.show()
Transect 4 - New Marsh/oyster¶
fig.show()
Marsh Site - All transects¶
fig.show()
Atmospheric Pressure¶
Atmospheric Pressure Comparison¶
fnames = ['hobo0721/processed/10754554.csv',
'hobo_0922/02_Processed/10754555.csv']
station = '8637689'
sensor = []
for f in fnames:
    file = pd.read_csv(root / 'atmosphere' / f, skiprows=1)
    atm_dt = pd.to_datetime(file['Date Time, GMT-04:00'])
    for c in file.columns:
        if 'Abs Pres' in c:
            atm = file[c]
    sensor.append(go.Scatter(x=atm_dt, y=atm * 0.689476,  # psi -> dbar
                             name='FHRL'))
start = atm_dt.iloc[0].strftime('%Y%m%d')
end = atm_dt.iloc[-1].strftime('%Y%m%d')
file = json.loads(noaa_data(start, end, station, interval='h', product='air_pressure'))
noaa_atm = [float(data['v']) / 100 for data in file['data']]  # mb -> dbar
noaa_dt = [pd.to_datetime(data['t']) for data in file['data']]
sensor.append(go.Scatter(x=noaa_dt, y=noaa_atm,
                         name='NOAA Yorktown'))
fig = go.Figure(data=sensor)
fig.update_layout(yaxis_title='Abs. Pressure [dbar]',
                  margin=dict(l=10, r=10, t=10, b=10),
                  font=dict(family='Times New Roman'))
fig.show()
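Beyond the visual overlay, a quantitative comparison of the two pressure records is useful. A minimal, self-contained sketch (synthetic series standing in for the HOBO and NOAA data above) that resamples the denser record to the hourly NOAA interval, aligns the two, and computes bias and RMSE:

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins: a 30-minute record and an hourly record around 10.13 dbar.
rng = np.random.default_rng(0)
t_fhrl = pd.date_range('2021-07-21', periods=48, freq='30min')
fhrl = pd.Series(10.13 + 0.02 * rng.standard_normal(48), index=t_fhrl)
t_noaa = pd.date_range('2021-07-21', periods=24, freq='h')
noaa = pd.Series(10.13 + 0.02 * rng.standard_normal(24), index=t_noaa)

# Resample the denser record to hourly means, then align on the shared index.
fhrl_h = fhrl.resample('h').mean()
both = pd.concat([fhrl_h, noaa], axis=1, keys=['fhrl', 'noaa']).dropna()
bias = (both['fhrl'] - both['noaa']).mean()
rmse = np.sqrt(((both['fhrl'] - both['noaa']) ** 2).mean())
print(f'bias={bias:.4f} dbar, rmse={rmse:.4f} dbar')
```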
Wave Data¶
Marsh¶
file = xr.open_dataset(root / 'waves' / 'rbr' / '0922' / 'processed3' / 'GMU08.nc')
file
<xarray.Dataset>
Dimensions: (Time: 16285, lat: 1, lon: 1, ele: 1, fc: 177,
RTime: 41336270)
Coordinates:
* Time (Time) float64 0.0 0.0 0.0 ... 7.391e+05 7.391e+05 7.391e+05
* lat (lat) float64 37.32
* lon (lon) float64 -76.43
* ele (ele) float64 -1.843
* fc (fc) float64 0.06055 0.06445 0.06836 ... 0.7402 0.7441 0.748
* RTime (RTime) float64 7.391e+05 7.391e+05 ... 7.391e+05 7.391e+05
Data variables:
Hs (Time) float64 ...
Tp (Time) timedelta64[ns] ...
Sf (fc, Time) float64 ...
depth (Time) float64 ...
Water_Pressure (RTime) float64 ...
    atm_pressure    (RTime) float64 ...

Canal¶
ADCP Marsh¶
file = pd.read_csv(root / 'waves' / 'adcp_marsh' / 'ADCP105.csv',delimiter=';')
file.head()
| DateTime | Battery | Heading | Pitch | Roll | Pressure | Temperature | AnalogIn1 | AnalogIn2 | Speed#1(0.115m) | ... | Speed#88(1.420m) | Dir#88(1.420m) | Speed#89(1.435m) | Dir#89(1.435m) | Speed#90(1.450m) | Dir#90(1.450m) | Speed#91(1.465m) | Dir#91(1.465m) | Speed#92(1.480m) | Dir#92(1.480m) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 7/21/2023 07:00:00 | 11.6 | 2.4 | 155.8 | -162.1 | 0.240 | 26.58 | 0 | 0 | 0.657 | ... | 0.391 | 222.20 | 0.686 | 243.06 | 0.497 | 242.82 | 0.671 | 230.38 | 0.466 | 222.56 |
| 1 | 7/21/2023 07:10:00 | 11.6 | 2.5 | 155.9 | -162.0 | 0.239 | 26.54 | 0 | 0 | 0.445 | ... | 0.540 | 230.33 | 0.649 | 234.91 | 0.540 | 224.02 | 0.553 | 221.70 | 0.500 | 226.38 |
| 2 | 7/21/2023 07:20:00 | 11.6 | 352.7 | 155.9 | -162.0 | 0.247 | 26.47 | 0 | 0 | 0.250 | ... | 0.298 | 230.59 | 0.457 | 232.03 | 0.301 | 277.24 | 0.377 | 255.89 | 0.239 | 256.43 |
| 3 | 7/21/2023 07:30:00 | 11.6 | 350.3 | 155.9 | -161.9 | 0.251 | 26.27 | 0 | 0 | 0.372 | ... | 0.397 | 277.23 | 0.316 | 310.25 | 0.258 | 260.39 | 0.210 | 275.19 | 0.154 | 311.05 |
| 4 | 7/21/2023 07:40:00 | 11.6 | 355.0 | 155.7 | -161.9 | 0.246 | 26.03 | 0 | 0 | 0.368 | ... | 0.527 | 77.94 | 0.504 | 96.49 | 0.687 | 97.02 | 0.426 | 96.06 | 0.495 | 101.06 |
5 rows × 193 columns
sname, dname = [], []
for f in file.columns:
    if 'Speed' in f:
        sname.append(f)
    if 'Dir' in f:
        dname.append(f)
avg_speed = file[sname].mean(axis=1)
# Note: an arithmetic mean of compass directions ignores the 0/360 deg wrap;
# a vector (circular) mean is preferable when directions straddle north.
avg_dir = (file[dname].mean(axis=1) + 180) % 360  # flip by 180 deg and wrap to [0, 360)
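Arithmetic averaging of compass directions, as used above, is sensitive to the 0/360° wrap. A vector (circular) mean, which averages the unit vectors and takes the resulting angle, avoids this; a small self-contained illustration with made-up directions:

```python
import numpy as np

# Hypothetical compass directions [deg] straddling north.
dirs = np.array([350.0, 10.0, 5.0, 355.0])
arith = dirs.mean()  # 180.0 -- misleading: points due south

# Vector (circular) mean: average the unit vectors, then take the angle.
rad = np.deg2rad(dirs)
circ = np.rad2deg(np.arctan2(np.sin(rad).mean(), np.cos(rad).mean())) % 360
print(arith, circ)  # the circular mean is approximately 0 deg (north)
```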
df = pd.DataFrame({'ws': avg_speed, 'wd': avg_dir})
df[['ws','wd']].describe()
| ws | wd | |
|---|---|---|
| count | 9097.000000 | 9097.000000 |
| mean | 0.052522 | 202.581383 |
| std | 0.063466 | 151.447503 |
| min | 0.000293 | -0.982174 |
| 25% | 0.005620 | 23.766087 |
| 50% | 0.030837 | 303.947609 |
| 75% | 0.087804 | 338.590326 |
| max | 0.739772 | 358.997500 |
f = folium.Figure(width=800, height=400)
m = folium.Map(location=[37.326282, -76.429166],zoom_start=15).add_to(f)
folium.TileLayer(
tiles = 'https://server.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer/tile/{z}/{y}/{x}',
attr = 'Esri',
name = 'Esri Satellite',
overlay = False,
control = True
).add_to(m)
fg1 = folium.FeatureGroup('Marsh Site').add_to(m)
for i in range(len(m_sensors)):
    folium.CircleMarker((m_sensors.geometry[i].y, m_sensors.geometry[i].x),
                        color='#CC79A7', fill_color='#CC79A7', fill_opacity=0.7,
                        radius=3, weight=2, popup=m_sensors['Name'][i]).add_to(fg1)
fg2 = folium.FeatureGroup('Canal Site').add_to(m)
folium.LayerControl().add_to(m)
m
def noaa_data(begin, end, station, vdatum='NAVD', interval='6',
              form='json', t_zone='GMT', unit='metric', product='water_level'):
    """Fetch a NOAA CO-OPS data product for a station and date range."""
    api = f'https://tidesandcurrents.noaa.gov/api/prod/datagetter?begin_date={begin}&end_date={end}&station={station}'\
          f'&product={product}&application=NOS.COOPS.TAC.WL&datum={vdatum}&interval={interval}&time_zone={t_zone}&units={unit}&format={form}'
    data = requests.get(url=api).content.decode()
    return data
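For reference, the CO-OPS response parsed earlier has roughly this JSON shape (an abridged, illustrative sample with made-up values in millibar; the real call goes through `requests` as in `noaa_data()`):

```python
import json

# Abridged sample mimicking the CO-OPS payload for product='air_pressure'.
sample = ('{"data": [{"t": "2021-07-21 00:00", "v": "1013.2"},'
          ' {"t": "2021-07-21 01:00", "v": "1012.8"}]}')
payload = json.loads(sample)

# Same parsing as in the pressure-comparison cell: strings -> floats, mb -> dbar.
times = [d['t'] for d in payload['data']]
pressures_dbar = [float(d['v']) / 100 for d in payload['data']]
print(times[0], pressures_dbar[0])
```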
from datetime import timedelta, datetime
def datenum_to_datetime(datenum):
    """
    Convert a MATLAB datenum into a Python datetime.

    :param datenum: Date in datenum format.
    :return: Datetime object corresponding to datenum (NaN if datenum < 1).
    """
    if datenum < 1:
        temp = np.nan
    else:
        days = datenum % 1
        hours = days % 1 * 24
        minutes = hours % 1 * 60
        seconds = minutes % 1 * 60
        temp = (datetime.fromordinal(int(datenum))
                + timedelta(days=int(days))
                + timedelta(hours=int(hours))
                + timedelta(minutes=int(minutes))
                + timedelta(seconds=round(seconds))
                - timedelta(days=366))  # MATLAB day 1 is 0000-01-01
    return temp
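A quick sanity check of the conversion: MATLAB's `datenum('01-Jan-2000 12:00')` is 730486.5, and MATLAB counts days from 0000-01-01, hence the 366-day offset from Python's proleptic ordinals. The snippet below is a compact, self-contained restatement of the same arithmetic:

```python
from datetime import datetime, timedelta

# MATLAB datenum 730486.5 corresponds to 2000-01-01 12:00:00.
datenum = 730486.5
result = (datetime.fromordinal(int(datenum))   # whole days since year 0 + 1
          + timedelta(days=datenum % 1)        # fractional day -> time of day
          - timedelta(days=366))               # MATLAB/Python epoch offset
print(result)  # 2000-01-01 12:00:00
```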